Simplicity Theory
Simplicity theory is a cognitive theory that seeks to explain the attractiveness of situations or events to human minds. It is based on work done by behavioural scientist Nick Chater, computer scientist Paul Vitanyi, psychologist Jacob Feldman, and artificial intelligence researchers Jean-Louis Dessalles (Dessalles, J.-L. (2013). "Algorithmic simplicity and relevance". In D. L. Dowe (Ed.), ''Algorithmic probability and friends'', LNAI 7070, 119-130. Berlin: Springer Verlag.) and Jürgen Schmidhuber. It claims that interesting situations appear simpler than expected to the observer.


Overview

Technically, simplicity corresponds to a drop in Kolmogorov complexity, which means that, for an observer, the shortest description of the situation is shorter than anticipated. For instance, the description of a consecutive lottery draw, such as 22-23-24-25-26-27, is significantly shorter than that of a typical one, such as 12-22-27-37-38-42. The former requires only one instantiation (the choice of the first lottery number), whereas the latter requires six instantiations. Simplicity theory makes several quantitative predictions concerning the way atypicality (Maguire, P., Moser, P. & Maguire, R. (2019). "Seeing patterns in randomness: a computational model of surprise". ''Topics in Cognitive Science'', 11 (1), 103-118.), distance, recency or prominence (of places or individuals) influence interestingness.
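The complexity drop in the lottery example can be sketched numerically. The following Python sketch uses a deliberately crude, hypothetical proxy for description length (the function name and the 1-in-49 lottery format are our assumptions, not part of the theory): a consecutive run is fully determined by its first number, while an arbitrary draw must spell out every number.

```python
import math

def description_bits(draw, n_max=49):
    """Crude description length (in bits) of a lottery draw.

    A consecutive run is determined by its first number alone
    (one instantiation); an arbitrary draw needs one instantiation
    per number. This is only an illustrative proxy, not a real
    Kolmogorov-complexity estimator.
    """
    bits_per_number = math.log2(n_max)
    if all(b - a == 1 for a, b in zip(draw, draw[1:])):
        return bits_per_number            # one instantiation
    return len(draw) * bits_per_number    # one instantiation per number

consecutive = [22, 23, 24, 25, 26, 27]
typical = [12, 22, 27, 37, 38, 42]

print(description_bits(consecutive))  # ~5.6 bits (one choice)
print(description_bits(typical))      # ~33.7 bits (six choices)
```

Under this toy measure the consecutive draw is about six times cheaper to describe, which is the "complexity drop" the theory takes as the signature of an interesting event.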


Formalization

The basic concept of simplicity theory is ''unexpectedness'', defined as the difference between expected complexity and observed complexity:

:U = C_{exp} - C_{obs}.

This definition extends the notion of ''randomness deficiency''. In most contexts, C_{exp} corresponds to ''generation'' or ''causal'' complexity: the length of the smallest description of all parameters that must be set in the "world" for the situation to exist. In the lottery example, generation complexity is identical for a consecutive draw and a typical draw (as long as no cheating is imagined) and amounts to six instantiations. Simplicity theory avoids most criticisms addressed at Kolmogorov complexity by considering only descriptions that are ''available'' to a given ''observer'', rather than any imaginable description. This makes complexity, and thus unexpectedness, observer-dependent. For instance, the typical draw 12-22-27-37-38-42 will appear very simple, even simpler than the consecutive one, to the person who played that combination.
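The definition above, including its observer-dependence, can be sketched in Python under the same crude proxy as before (all names, and the one-bit cost of "my combination", are our illustrative assumptions):

```python
import math

BITS_PER_NUMBER = math.log2(49)  # assumed 1-in-49 lottery

def observed_bits(draw, played=None):
    """Toy observed complexity C_obs, relative to one observer.

    `played` is the combination this observer played, if any; for
    that observer, "my combination" is an extremely short available
    description (modelled here, arbitrarily, as one bit).
    """
    if played is not None and draw == played:
        return 1.0
    if all(b - a == 1 for a, b in zip(draw, draw[1:])):
        return BITS_PER_NUMBER                 # "run starting at 22"
    return len(draw) * BITS_PER_NUMBER         # spell out all six

def unexpectedness(draw, played=None):
    """U = C_exp - C_obs, with generation complexity C_exp fixed at
    six instantiations for any six-number draw (no cheating assumed)."""
    c_exp = 6 * BITS_PER_NUMBER
    return c_exp - observed_bits(draw, played)

typical = [12, 22, 27, 37, 38, 42]
print(unexpectedness(typical))                  # 0.0: nothing surprising
print(unexpectedness(typical, played=typical))  # large: "my numbers came up!"
```

Note that C_exp is the same for every draw; only C_obs varies with the observer, which is exactly why the same combination can be dull to a bystander and astonishing to the player.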


Connection with probability

Algorithmic probability is defined on the basis of Kolmogorov complexity (Solomonoff, R. J. (1964). "A Formal Theory of Inductive Inference". ''Information and Control'', 7 (1), 1-22.): complex objects are less probable than simple ones. The link between complexity and probability is reversed when probability measures surprise and unexpectedness: simple events appear ''less'' probable than complex ones. Unexpectedness U is linked to subjective probability P as

:P = 2^{-U}.

The advantage of this formula is that subjective probability can be assessed without necessarily knowing the alternatives. Classical approaches to (objective) probability consider sets of events, since fully instantiated individual events have virtually zero probability of having occurred, or of occurring again, in the world. Subjective probability, by contrast, concerns individual events. Simplicity theory measures it through randomness deficiency, or complexity drop. This notion of subjective probability refers not to the event itself, but to what makes the event unique.
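The conversion from unexpectedness to subjective probability is a one-liner; a minimal sketch (the function name is ours):

```python
def subjective_probability(u_bits):
    """P = 2^(-U): each extra bit of unexpectedness halves the
    subjective probability of the event."""
    return 2.0 ** (-u_bits)

# An event 10 bits simpler than expected feels like a 1-in-1024
# occurrence, without enumerating any alternative events.
print(subjective_probability(10))  # 0.0009765625
print(subjective_probability(0))   # 1.0: nothing unexpected
```

No sample space is needed: the formula turns a description-length difference directly into a degree of perceived improbability, which is what allows it to apply to a single, fully instantiated event.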


References

{{Reflist


External links


A tutorial on Simplicity Theory


Cognitive science